A Federated Filtering Framework for Internet of Medical Things
Based on the dominant paradigm, the wearable IoT devices used in the
healthcare sector, also known as the Internet of Medical Things (IoMT), are
resource-constrained in power and computational capability. IoMT devices
continuously push their readings to remote cloud servers for real-time data
analytics, which drains the device battery faster. Other demerits of
continuously centralizing data include exposed privacy and high latency. This
paper presents a novel Federated Filtering Framework for IoMT devices, which
is based on predicting the data at the central fog server using shared models
provided by the local IoMT devices. The
fog server performs model averaging to predict the aggregated data matrix and
also computes filter parameters for the local IoMT devices. Two significant
theoretical contributions of this paper are the global tolerable perturbation
error and the local filtering parameter: the former controls the
decision-making accuracy under eigenvalue perturbation, and the latter
balances the tradeoff between the communication overhead and the perturbation
error of the aggregated (predicted) data matrix at the fog server.
Experimental evaluation on real healthcare data demonstrates that the
proposed scheme saves up to 95\% of the communication cost while maintaining
reasonable data privacy and low latency.
Comment: 6 pages, 6 figures; accepted for oral presentation at IEEE ICC 2019.
Keywords: Internet of Things, Federated Learning, Perturbation theory.
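The scheme described above can be illustrated with a minimal sketch, assuming each IoMT device shares a simple model vector and a scalar filtering threshold; the function names, the averaging rule, and the transmit-only-on-large-error filter are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def fog_average(local_models):
    """Model averaging at the fog server: element-wise mean of the
    model vectors shared by the local IoMT devices."""
    return np.mean(np.stack(local_models), axis=0)

def should_transmit(reading, predicted, delta):
    """Local filter on an IoMT device: push the reading to the server
    only when the server-side prediction deviates by more than the
    filtering threshold delta; otherwise stay silent and save power."""
    return abs(reading - predicted) > delta

# Two devices share local models; the fog server aggregates them.
models = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
global_model = fog_average(models)  # average of the shared models: [2.0, 3.0]

# A device only transmits when the prediction error exceeds delta.
send = should_transmit(reading=38.9, predicted=38.7, delta=0.5)  # False
```

Under such a filter, communication cost falls in proportion to how often the fog server's predictions stay within the tolerance, which is the mechanism behind the reported savings.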
On Blow-up criterion for the Nonlinear Schr\"{o}dinger Equation
Blow-up is studied for the nonlinear Schr\"{o}dinger equation with an odd
nonlinearity in the energy-critical or energy-supercritical case. It is shown
that a solution with negative energy blows up in finite or infinite time. A
new proof is also presented for the earlier result in \cite{HoRo2}, where a
similar but more general result was shown in the energy-subcritical case.
Comment: In this version, we add a reference and change some expressions in
English.
Robust Orthogonal Complement Principal Component Analysis
Recently, the robustification of principal component analysis has attracted
considerable attention from statisticians, engineers, and computer scientists. In
this work we study the type of outliers that are not necessarily apparent in
the original observation space but can seriously affect the principal subspace
estimation. Based on a mathematical formulation of such transformed outliers, a
novel robust orthogonal complement principal component analysis (ROC-PCA) is
proposed. The framework combines the popular sparsity-enforcing and low-rank
regularization techniques to deal with row-wise outliers as well as
element-wise outliers. A non-asymptotic oracle inequality guarantees the
accuracy and high breakdown performance of ROC-PCA in finite samples. To tackle
the computational challenges, an efficient algorithm is developed on the basis
of Stiefel manifold optimization and iterative thresholding. Furthermore, a
batch variant is proposed to significantly reduce the cost in ultra high
dimensions. The paper also points out a pitfall of a common practice of SVD
reduction in robust PCA. Experiments demonstrate the effectiveness and
efficiency of ROC-PCA on both synthetic and real data.
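The sparsity-enforcing side of such schemes is typically implemented with a soft-thresholding step; the sketch below shows this generic operator (the proximal map of the l1 penalty), not the authors' ROC-PCA algorithm itself:

```python
import numpy as np

def soft_threshold(X, tau):
    """Element-wise soft-thresholding: shrinks every entry of X toward
    zero by tau and zeroes out entries smaller than tau in magnitude.
    This is the building block of iterative-thresholding robust PCA
    variants that separate sparse outliers from a low-rank component."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

# Entries below the threshold are killed; large (outlier-like) entries survive.
X = np.array([[3.0, -0.5],
              [0.2, -2.0]])
S = soft_threshold(X, 1.0)  # [[2.0, 0.0], [0.0, -1.0]]
```

Iterating such a shrinkage step alternately with a subspace (or Stiefel-manifold) update is the usual pattern for algorithms of this family.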
Group Iterative Spectrum Thresholding for Super-Resolution Sparse Spectral Selection
Recently, sparsity-based algorithms have been proposed for super-resolution
spectrum estimation. However, to achieve adequately high resolution in
real-world signal analysis, the dictionary atoms have to be close to each other
in frequency, thereby resulting in a coherent design. The popular convex
compressed sensing methods break down in the presence of high coherence and
large noise. We propose a new regularization approach to handle model collinearity
and obtain parsimonious frequency selection simultaneously. It takes advantage
of the pairing structure of sine and cosine atoms in the frequency dictionary.
A probabilistic spectrum screening is also developed for fast computation in
high dimensions. A data-resampling version of high-dimensional Bayesian
Information Criterion is used to determine the regularization parameters.
Experiments show the efficacy and efficiency of the proposed algorithms in
challenging situations with small sample size, high frequency resolution, and
low signal-to-noise ratio.
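The pairing structure of sine and cosine atoms mentioned above is usually exploited by thresholding each frequency's coefficient pair jointly rather than element-wise; the following is a minimal sketch of such a group soft-thresholding rule, an illustrative assumption rather than the paper's exact estimator:

```python
import numpy as np

def group_soft_threshold(a, b, tau):
    """Group soft-thresholding of the (sine, cosine) coefficient pair
    (a, b) belonging to one frequency atom: the pair is shrunk by its
    joint Euclidean norm, so a frequency is selected or discarded as a
    whole instead of losing only its sine or only its cosine part."""
    norm = np.hypot(a, b)
    if norm == 0.0:
        return 0.0, 0.0
    scale = np.maximum(1.0 - tau / norm, 0.0)
    return a * scale, b * scale

# A pair with joint norm 5.0: shrunk by tau=2.5, killed entirely by tau=5.0.
kept = group_soft_threshold(3.0, 4.0, 2.5)   # (1.5, 2.0)
killed = group_soft_threshold(3.0, 4.0, 5.0)  # (0.0, 0.0)
```

Selecting pairs jointly is what keeps the recovered spectrum parsimonious in frequency even when individual sine or cosine coefficients are small.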